† Corresponding author. E-mail:
Project supported by the National Natural Science Foundation of China (Grant Nos. 61503338, 61573316, 61374152, and 11302195) and the Natural Science Foundation of Zhejiang Province, China (Grant No. LQ15F030005).
In this paper, a novel design procedure is proposed for synthesizing high-capacity auto-associative memories based on complex-valued neural networks with real-imaginary-type activation functions and constant delays. Stability criteria dependent on the external inputs of the neural networks are derived. The designed networks retrieve the stored patterns via external inputs rather than initial conditions. The derived networks can memorize the desired patterns with lower-dimensional neural networks than real-valued neural networks, and eliminate the spurious equilibria of complex-valued neural networks. One numerical example is provided to show the effectiveness and superiority of the presented results.
Since Hopfield’s inspirational work on the stability of the Hopfield neural network using an energy function,[1] numerous studies on the dynamics of various neural networks have been reported.[2–13] Owing to their strong learning ability and high classification and prediction accuracy, the dynamics of neural networks have been investigated extensively to achieve associative memory and pattern recognition. Design procedures for synthesizing associative memories have been proposed based on real-valued neural networks (RVNNs)[14–20] and complex-valued neural networks (CVNNs).[21–26]
In the design procedures based on RVNNs, the coexistence of many equilibria is a necessary feature for applying neural networks to associative memory storage or pattern recognition. The existence and stability of multiple equilibria of neural networks have been investigated extensively.[27–33] However, because these designs depend on the initial states of the neural networks, they suffer from the appearance of spurious equilibria. Spurious equilibria are difficult to avoid, and accurate pattern recall cannot be guaranteed by using only the local stability of neural networks. To solve this problem, design procedures were presented for synthesizing associative memories based on RVNNs by ensuring that every trajectory converges globally to a unique equilibrium point that depends on the external inputs of the neural networks; see, for example, Refs. [15], [16], [19], and [20]. These designed networks define a nonlinear mapping from the space of external inputs to that of steady-state outputs. In the design procedures based on CVNNs, neural networks have been discussed as multistate associative memory models by employing a class of amplitude-phase-type activation functions, i.e., multilevel sigmoid functions. Although such dynamics are favorable for gray-level image reconstruction, they often suffer from convergence to an undesirable spurious pattern. Much work has been done to overcome this shortcoming.[21,23,24] As is well known, spurious patterns are avoided more effectively when the convergence of neural networks depends on external inputs instead of initial states. However, to the best of the authors’ knowledge, little attention has been paid to designing associative memory procedures based on CVNNs that depend on external inputs.
It is well known that activation functions play an important role in the dynamical analysis of neural networks. In RVNNs, the activation functions are usually chosen as smooth and bounded functions. However, this choice is not suitable for CVNNs, since by Liouville’s theorem every bounded entire function on the complex plane must be constant. The choice of activation functions is therefore the main challenge for CVNNs. Two kinds of complex-valued activation functions have been presented in previous studies, namely real-imaginary-type activation functions[34] and amplitude-phase-type activation functions.[23,25] Much work has been done to study the design procedure of associative memories based on CVNNs with amplitude-phase-type activation functions[21–25] and the convergence of CVNNs with real-imaginary-type activation functions.[6,34] To the best of the authors’ knowledge, however, few associative memory results based on CVNNs with real-imaginary-type activation functions have been published. In fact, this is a very meaningful topic. First, real-imaginary-type activation functions are suitable for dealing with complex information that must have symmetry with respect to, or a special meaning on, the real and imaginary axes. Second, the real and imaginary parts of the activation functions can be functions taking multiple constant values, which helps realize the coexistence of multiple equilibrium points of the neural networks. Moreover, one advantage of using real-imaginary-type activation functions is that the whole CVNN can be analyzed by treating the real and imaginary parts separately and independently, which reduces the complexity of computation and analysis.
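As a concrete illustration of the real-imaginary-type structure described above (the specific saturating function `g` below is an assumption for illustration, not the exact activation constructed in this paper), such an activation applies a bounded piecewise-linear scalar function to the real and imaginary parts of its argument separately:

```python
import numpy as np

def g(u):
    # Bounded piecewise-linear scalar function (illustrative choice;
    # the paper's activations are also piecewise linear in each part).
    return np.clip(u, -1.0, 1.0)

def f(z):
    # Real-imaginary-type activation: act on Re(z) and Im(z)
    # separately, then recombine into a complex value.
    return g(np.real(z)) + 1j * g(np.imag(z))

z = np.array([2.0 + 0.3j, -0.5 - 4.0j])
print(f(z))  # the out-of-range part of each component saturates to +/-1
```

Because `f` factors into independent real and imaginary maps, the dynamics of a CVNN using it can be analyzed as two coupled real systems, which is the decomposition exploited in the paper.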
In the electronic implementation of analog neural networks, time delays always exist due to signal transmission and the finite switching speed of amplifiers. The existence of time delays may lead to instability and oscillation in a neural network. Therefore, it is necessary to introduce delays into neural network models when synthesizing associative memories.
Motivated by the above discussions, the main goal of this paper is to investigate a design procedure for auto-associative memories based on complex-valued neural networks with real-imaginary-type activation functions and constant delays. Our first main contribution is the construction of a class of real-imaginary-type activation functions whose real and imaginary parts are piecewise-linear functions. Secondly, four lemmas are presented to track the dynamics of the solution flows of CVNNs. Based on these lemmas, global exponential stability criteria of CVNNs are established, which depend on the external inputs of the neural networks. Moreover, a high-capacity design procedure is proposed for synthesizing auto-associative memories. There are two difficulties in the analysis and design of high-capacity auto-associative memories based on CVNNs with real-imaginary-type activation functions.
Compared with the existing relevant results, the main advantages of this paper include the following two points.
The remainder of this paper consists of the following sections. Section 2 describes the problem formulations. The main results are presented in Section 3. Section 4 gives an example to show the effectiveness of the obtained results. Finally, in Section 5, the conclusion is drawn.
Denote {−1, 1}^{2n} as the set of 2n-dimensional bipolar vectors, i.e., {−1, 1}^{2n} = {x ∈ ℝ^{2n} : x = (x_1, x_2, …, x_{2n})^T, x_i = 1 or −1, i = 1, 2, …, 2n}. Denote i as the imaginary unit, that is
Consider a complex-valued neural network with multiple delays described by
Let
In order to synthesize the auto-associative memory in the design problem of this paper, the following hypothesis about the activation function is needed.
Since the real part and imaginary part of the activation function under Hypothesis 1 have individual properties, we can separate system (
The definition of the exponential stability is given as follows.
In order to obtain the main results, we need the following lemma.
The purpose of this paper is to design an auto-associative memory based on CVNNs. Firstly, we will introduce four lemmas according to Ref. [29]. Through these four lemmas, the dynamics of each state component of system (
The proofs of Lemmas 2–5 are similar to the proof of Lemma 2 in Ref. [29], and thus are omitted here for brevity.
Denote four index subsets as follows:
Now, we will show the existence of the globally exponentially stable equilibrium point of system (
The neuron state will stay in them afterwards. That is, when the state z(t) of system (
For the case of system (
In particular, when yi = 0,
Based on the above discussion, one can obtain the following design procedure of auto-associative memory.
In the application of associative memory, the parameters of the CVNNs are designed according to the information about pairwise vectors of desired patterns and inputs. In this paper, the input vectors are designed to be the same as the vectors of the desired patterns. According to the above design procedure, the parameters D, A, and B can be designed, and hence the CVNNs can be derived. Once the neural networks are designed, the parameters do not change, and the stable output vectors depend only on the input vectors. If one input is fed as a probe, the corresponding desired pattern will be recalled. Note that the choice of the network parameters depends on some inequalities; therefore, the parameters can vary within a certain range without affecting the result of the associative memories.
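The input-driven (rather than initial-state-driven) retrieval described above can be sketched numerically. The toy model and all parameter values below are illustrative assumptions, not the parameters designed in this paper: a scalar delayed CVNN with connection weights small relative to the self-feedback is integrated by forward Euler from two very different initial states under the same external input, and both trajectories settle at the same input-determined equilibrium.

```python
import numpy as np

def g(u):
    # Bounded piecewise-linear scalar function (illustrative choice).
    return np.clip(u, -1.0, 1.0)

def f(z):
    # Real-imaginary-type activation applied to Re(z) and Im(z) separately.
    return g(z.real) + 1j * g(z.imag)

def simulate(z0, I, T=40.0, dt=0.01, tau=1.0, d=1.0, a=0.1, b=0.1):
    """Forward-Euler integration of the scalar delayed model
       dz/dt = -d*z(t) + a*f(z(t)) + b*f(z(t - tau)) + I.
    Keeping a and b small relative to d makes the dynamics contracting,
    so the equilibrium is fixed by the external input I alone."""
    n_delay = int(round(tau / dt))
    hist = [z0] * (n_delay + 1)   # constant initial history on [-tau, 0]
    z = z0
    for _ in range(int(round(T / dt))):
        z_delayed = hist[0]       # state at time t - tau
        z = z + dt * (-d * z + a * f(z) + b * f(z_delayed) + I)
        hist.append(z)
        hist.pop(0)
    return z

I = 1.0 + 1.0j                    # external input used as the probe
z_a = simulate(0.9 - 2.0j, I)     # one initial condition
z_b = simulate(-3.0 + 0.5j, I)    # a very different initial condition
print(abs(z_a - z_b))             # both settle at the same equilibrium
```

In this contracting regime the steady state solves z = a*f(z) + b*f(z) + I (here 1.2 + 1.2j), independently of the initial history, mirroring the input-dependent retrieval property of the designed networks.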
In this section, we will give one example to illustrate the effectiveness of our results.
Since the external inputs are equal to the desired patterns in the auto-associative memory, we can give the input vectors as follows:
When the input probe is set to be I(1), the desired pattern α(1) will be memorized. When the input probe is set to be I(2), the desired pattern α(2) will be memorized. Simulations show the effectiveness of the results with four random initial values in Figs.
In this paper, a new design procedure for synthesizing auto-associative memories has been proposed based on CVNNs with real-imaginary-type activation functions and constant delays. Global exponential stability criteria have been derived, and the stability results depend on the external inputs of the neural networks. Compared with previous results, the proposed design method is suitable for dealing with complex information that must have symmetry with respect to, or a special meaning on, the real and imaginary axes; it eliminates spurious memory patterns, relaxes the constraints on the parameters of the recurrent neural networks, and achieves a high storage capacity for auto-associative memories.
Several extensions would be welcome as future work: